A Newton Frank–Wolfe method for constrained self-concordant minimization

Authors

Deyi Liu, Volkan Cevher, Quoc Tran-Dinh

Abstract

We develop a new Newton Frank–Wolfe algorithm to solve a class of constrained self-concordant minimization problems using linear minimization oracles (LMO). Unlike L-smooth convex functions, for which Lipschitz continuity of the objective gradient holds globally, self-concordant functions only admit local bounds, making it difficult to estimate the number of LMO calls required by the underlying optimization algorithm. Fortunately, we can still prove that the number of LMO calls of our method is nearly the same as that of the standard Frank–Wolfe method in the L-smooth case. Specifically, our method requires at most $$\mathcal{O}\big(\varepsilon^{-(1+\nu)}\big)$$ LMO calls, where $$\varepsilon$$ is the desired accuracy and $$\nu \in (0, 0.139)$$ is a given constant depending on the chosen initial point of the proposed algorithm. Our intensive numerical experiments on three applications (portfolio design with the competitive ratio, D-optimal experimental design, and logistic regression with an elastic-net regularizer) show that our method outperforms several state-of-the-art competitors.
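
For intuition, below is a minimal sketch of the kind of LMO-driven Frank–Wolfe loop the abstract refers to, assuming a generic gradient oracle and a simplex constraint as a toy example. The $$2/(k+2)$$ step size is the classical rule for the L-smooth case and stands in as a placeholder; the paper's actual contribution is a Newton-type step derived from self-concordance, which this sketch does not reproduce.

    import numpy as np

    def frank_wolfe(grad, lmo, x0, max_iters=500, tol=1e-8):
        """Generic LMO-driven Frank-Wolfe loop (illustrative sketch).

        grad: returns the objective gradient at x.
        lmo:  returns argmin_{s in C} <g, s> for a given gradient g.
        """
        x = x0.copy()
        for k in range(max_iters):
            g = grad(x)
            s = lmo(g)                  # one LMO call per iteration
            if g @ (x - s) <= tol:      # duality-gap stopping test
                return x
            # Placeholder step size; the paper replaces this with a
            # Newton-type rule tailored to self-concordant objectives.
            x = x + (2.0 / (k + 2)) * (s - x)
        return x

    # Toy usage: minimize ||x - b||^2 over the probability simplex,
    # whose LMO returns the vertex with the smallest gradient entry.
    b = np.array([0.1, 0.9, 0.4])
    lmo = lambda g: np.eye(g.size)[np.argmin(g)]
    x_star = frank_wolfe(lambda x: 2.0 * (x - b), lmo, np.ones(3) / 3.0)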


Similar Articles

Randomized block proximal damped Newton method for composite self-concordant minimization

In this paper we consider the composite self-concordant (CSC) minimization problem, which minimizes the sum of a self-concordant function f and a (possibly nonsmooth) proper closed convex function g. The CSC minimization is the cornerstone of the path-following interior point methods for solving a broad class of convex optimization problems. It has also found numerous applications in machine le...
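
For context, the damped Newton step that such proximal damped Newton methods build on, written here for a plain (non-composite) self-concordant $$f$$ with Newton decrement $$\lambda(x)$$; the randomized block and proximal ingredients of the paper are not reproduced:

$$\lambda(x) = \big(\nabla f(x)^{\top}\,\nabla^{2} f(x)^{-1}\,\nabla f(x)\big)^{1/2}, \qquad x^{+} = x - \frac{1}{1+\lambda(x)}\,\nabla^{2} f(x)^{-1}\,\nabla f(x).$$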

Composite self-concordant minimization

We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function, endowed with an easily computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new set of analytic step-size ...
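
As a reference point, variable metric (proximal Newton) iterations of this type typically solve a scaled proximal subproblem of the following standard form at each step; the paper's specific metric and analytic step sizes are not shown here:

$$x^{+} = \arg\min_{y} \Big\{ \nabla f(x)^{\top}(y - x) + \tfrac{1}{2}(y - x)^{\top}\,\nabla^{2} f(x)\,(y - x) + g(y) \Big\}.$$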

Quasi-Newton acceleration for equality-constrained minimization

Optimality (or KKT) systems arise as primal-dual stationarity conditions for constrained optimization problems. Under suitable constraint qualifications, local minimizers satisfy KKT equations but, unfortunately, many other stationary points (including, perhaps, maximizers) may solve these nonlinear systems too. For this reason, nonlinear-programming solvers make strong use of the minimization ...
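
For reference, for the equality-constrained problem $$\min_x f(x)$$ subject to $$h(x) = 0$$, the KKT system in question is the nonlinear system in the primal-dual pair $$(x, \lambda)$$:

$$\nabla f(x) + \nabla h(x)^{\top} \lambda = 0, \qquad h(x) = 0.$$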

Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods

We study the smooth structure of convex functions by generalizing the powerful concept of self-concordance, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call generalized self-concordant functions. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. The prop...
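
For reference, the original Nesterov–Nemirovskii notion: a convex function $$f$$ is (standard) self-concordant if, along every line $$\varphi(t) = f(x + tv)$$, the third derivative is controlled by the second via

$$|\varphi'''(t)| \le 2\,\varphi''(t)^{3/2}.$$

The generalized notion studied in this paper relaxes the constant and the exponent in this inequality.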

A Modified Newton Method for Minimization I

Some promising ideas for minimizing a nonlinear function, whose first and second derivatives are given, by a modified Newton method, were introduced by Fiacco and McCormick (Ref. 1). Unfortunately, in developing a method around these ideas, Fiacco and McCormick used a potentially unstable, or even impossible, matrix factorization. Using some recently developed techniques for factorizing an inde...
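
One standard way to realize a stabilized modified Newton step (shown purely for illustration; it is not necessarily the factorization developed in this paper) is to shift the Hessian just enough to make it positive definite:

$$d = -\big(\nabla^{2} f(x) + \tau I\big)^{-1} \nabla f(x), \qquad \tau \ge 0 \ \text{chosen so that} \ \nabla^{2} f(x) + \tau I \succ 0.$$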


Journal

Journal title: Journal of Global Optimization

Year: 2021

ISSN: 1573-2916, 0925-5001

DOI: https://doi.org/10.1007/s10898-021-01105-z